Competing with Gaussian linear experts

Authors

  • Fedor Zhdanov
  • Vladimir Vovk
Abstract

We study the problem of online regression. We do not make any assumptions about input vectors or outcomes. We prove a theoretical bound on the square loss of Ridge Regression. We also show that Bayesian Ridge Regression can be thought of as an online algorithm competing with all the Gaussian linear experts. We then consider the case of infinite-dimensional Hilbert spaces and prove relative loss bounds for the popular non-parametric kernelized Bayesian Ridge Regression and kernelized Ridge Regression. Our main theoretical guarantees have the form of equalities.
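The abstract concerns online Ridge Regression: at each step the learner predicts the next outcome from the examples seen so far, with a square-loss guarantee relative to linear experts. As a minimal illustration of that protocol (my own sketch, not the authors' algorithm; the one-dimensional setting and the function name are assumptions for clarity), here is online ridge prediction with regularization parameter `a > 0`, where the learner commits to a prediction before seeing each outcome:

```python
# Sketch of online Ridge Regression in one dimension (illustrative only).
# At each step t, predict w_t * x_t where w_t minimizes
# a*w^2 + sum_{i<t} (y_i - w*x_i)^2, i.e. w_t = sxy / (a + sxx).

def online_ridge_1d(data, a=1.0):
    """Predict each outcome from the past (x, y) pairs via 1-D ridge regression."""
    sxx = 0.0  # running sum of x_i^2 over past examples
    sxy = 0.0  # running sum of x_i * y_i over past examples
    preds = []
    for x, y in data:
        preds.append(sxy / (a + sxx) * x)  # predict before seeing y
        sxx += x * x                       # then update the sufficient statistics
        sxy += x * y
    return preds
```

The kernelized case discussed in the abstract replaces the inner products above with kernel evaluations; the online protocol itself is unchanged.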


Similar articles

Thompson Sampling for Online Learning with Linear Experts

In this note, we present a version of the Thompson sampling algorithm for the problem of online linear optimization with full information (i.e., the experts setting), studied by Kalai and Vempala, 2005. The algorithm uses a Gaussian prior and time-varying Gaussian likelihoods, and we show that it essentially reduces to Kalai and Vempala's Follow-the-Perturbed-Leader strategy, with exponentiall...
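The Follow-the-Perturbed-Leader strategy mentioned in this snippet can be sketched briefly (my own illustration under the Gaussian-perturbation variant the note describes; the function name, the loss-matrix input format, and the scale parameter are assumptions): at each round, play the expert whose cumulative loss plus an independent Gaussian perturbation is smallest.

```python
import random

def fpl_gaussian(loss_rounds, sigma=1.0, seed=0):
    """Follow-the-Perturbed-Leader with Gaussian perturbations (sketch).

    loss_rounds: list of rounds, each a list with one loss per expert.
    Returns the total loss incurred by the perturbed leader.
    """
    rng = random.Random(seed)
    n = len(loss_rounds[0])
    cum = [0.0] * n   # cumulative loss of each expert so far
    total = 0.0
    for losses in loss_rounds:
        # Perturb each expert's cumulative loss, then follow the leader.
        perturbed = [cum[i] + sigma * rng.gauss(0.0, 1.0) for i in range(n)]
        leader = min(range(n), key=lambda i: perturbed[i])
        total += losses[leader]
        for i in range(n):  # update cumulative losses after playing
            cum[i] += losses[i]
    return total
```

With `sigma=0` this degenerates to plain Follow-the-Leader; the Gaussian noise is what gives the strategy its regret guarantee against adaptive loss sequences.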


Parameter Estimation in Spatial Generalized Linear Mixed Models with Skew Gaussian Random Effects using Laplace Approximation

Spatial generalized linear mixed models are commonly used for modelling non-Gaussian discrete spatial responses. We present an algorithm for parameter estimation of the models using Laplace approximation of the likelihood function. In these models, the spatial correlation structure of the data is captured by random effects or latent variables. In most spatial analysis, it is assumed that rando...


Variational Mixture of Gaussian Process Experts

Mixture of Gaussian processes models extend a single Gaussian process with the ability to model multi-modal data and reduce training complexity. Previous inference algorithms for these models are mostly based on Gibbs sampling, which can be very slow, particularly for large-scale data sets. We present a new generative mixture of experts model. Each expert is still a Gaussian process but ...


Numerical solution of second-order stochastic differential equations with Gaussian random parameters

In this paper, we present the numerical solution of stochastic differential equations (SDEs) of various orders, especially second-order, with time-varying and Gaussian random coefficients. We give a complete analysis for second-order equations in the special case of scalar linear second-order equations (damped harmonic oscillators with additive or multiplicative noises). Making stochastic differe...


Experts Combination through Density Decomposition

This paper is concerned with an important issue in Statistics and Artificial Intelligence: problem decomposition and experts (or predictors) combination. Decomposition methods usually adopt a divide-and-conquer strategy which decomposes the initial problem into simple sub-problems. The global expert is then obtained from some combination of the local experts. In the case of hard decompo...



Journal:
  • CoRR

Volume abs/0910.4683  Issue 

Pages  -

Publication date 2009